Assessing declarative and procedural knowledge using multiple-choice questions
Authors
Abstract
Assessment is an indispensable aspect of academic life, a natural partner to curriculum, and a central component of medical education (1). Beyond its core objective of gauging student performance, assessment is also crucial for documenting the attainment of learning outcomes, verifying medical competence and, ultimately, minimizing risks to patients arising from an unsatisfactory knowledge base (2). For ‘traditional’ written exams, it is crucially important to generate high-quality multiple-choice questions (MCQs) for formative and summative assessments (3). In both applications, MCQs should promote effective learning that, in turn, prepares trainees for safe and competent clinical practice. As such, there is an increasing need for MCQs that examine both declarative and procedural knowledge, since each assesses knowledge from a different perspective.

Declarative knowledge MCQs are defined as questions that assess ‘pure recall’ of specific, isolated pieces of knowledge such as facts, definitions, terminologies, and concepts (4). An example of a declarative knowledge MCQ would be: which organ of the body produces insulin? By contrast, procedural knowledge MCQs are defined as questions that assess ‘problem-solving skills’: understanding concepts, assembling knowledge from across scientific disciplines, making rational predictions, applying critical judgment, arriving at conclusions, and deciding on the best course of action (4). An example of a procedural knowledge MCQ would be: a 55-year-old patient presents with high-grade fever, significant weight loss of 15%, excessive night sweats, lower abdominal pain, and a large volume of blood in the stool; what is the best next step in the management of this patient?

To process procedural knowledge properly in reaching conclusions or making sound clinical judgments, medical students must have a solid foundation of declarative knowledge upon which to build (4). In achieving this, however, educators must minimize ‘blunt’ short-term memorization of declarative knowledge, which results neither in actual learning nor in clinically useful application. In summary, MCQs tapping declarative knowledge should be given credence equal to that of procedural knowledge MCQs in curricular assessments.

Reflecting current trends, formal assessments of procedural knowledge are important in honing problem-solving and critical-thinking skills (5), as well as in nurturing the development of life-long learning. However, such procedural knowledge should itself be assessed on more than mere short-term knowledge retention, focusing instead on the comprehension and long-term retrieval of scientific knowledge readily applicable to a range of problem-based, real-life clinical situations. Moreover, MCQs measuring procedural knowledge should encourage the use of higher-level cognition geared toward critical appraisal and problem-solving skills. When targeting procedural knowledge, it is important to construct questions that explore a deeper understanding of basic science content and demand higher-order thinking skills to integrate that knowledge with other scientific principles in relevant, clinically oriented contexts. These practices, in turn, promote longer-term retention of knowledge and prepare medical students for competent patient care. While crafting declarative knowledge MCQs is fairly simple, structuring procedural knowledge MCQs can be challenging (6).
Using short-answer questions (SAQs) that require problem-solving is one method that allows for the integration of basic and clinical science (7) and can explore students’ knowledge application, problem-solving skills, clinical reasoning abilities, critical appraisal capacities, and management proficiencies in the context of clinically based cases. Of course, as with the measurement of any construct, the use of multiple assessment methods is advisable (1).